Interpolatory methods for model reduction of multi-input/multi-output systems
We develop here a computationally effective approach for producing
high-quality H∞-approximations to large-scale linear
dynamical systems having multiple inputs and multiple outputs (MIMO). We extend
an approach for model reduction introduced by Flagg,
Beattie, and Gugercin for the single-input/single-output (SISO) setting, which
combined ideas originating in interpolatory H2-optimal model
reduction with complex Chebyshev approximation. Retaining this framework, our
approach to the MIMO problem has its principal computational cost dominated by
(sparse) linear solves, and so it can remain an effective strategy in many
large-scale settings. We are able to avoid the computationally demanding
H∞ norm calculations that are normally required to monitor
progress within each optimization cycle through the use of "data-driven"
rational approximations that are built upon previously computed function
samples. Numerical examples are included that illustrate our approach. We
produce high fidelity reduced models having consistently better
performance than models produced via balanced truncation;
these models often are as good as (and occasionally better than) models
produced using optimal Hankel norm approximation as well. In all cases
considered, the method described here produces reduced models at far lower cost
than is possible with either balanced truncation or optimal Hankel norm
approximation.
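The "data-driven" rational approximation idea above can be illustrated with a small sketch: build a barycentric rational interpolant from previously computed transfer-function samples and use its maximum modulus on a frequency grid as a cheap surrogate for an expensive norm computation. This is only an illustration of the general technique, not the paper's algorithm; the transfer function H(s) = 1/(s + 1), the sample points, and the use of Berrut's weights are all assumptions made for the example.

```python
import numpy as np

def barycentric_eval(z, nodes, values, weights=None):
    """Evaluate a barycentric rational interpolant of the samples at z."""
    nodes = np.asarray(nodes)
    values = np.asarray(values)
    if weights is None:
        # Berrut's weights (-1)^j: a simple pole-free default choice
        weights = (-1.0) ** np.arange(len(nodes))
    diff = z - nodes
    hit = np.isclose(diff, 0.0)
    if hit.any():
        return values[np.argmax(hit)]  # exact hit on a sample node
    c = weights / diff
    return c.dot(values) / c.sum()

# frequency samples of a hypothetical transfer function H(s) = 1/(s + 1)
nodes = 1j * np.linspace(0.1, 10.0, 8)
samples = 1.0 / (nodes + 1.0)

# cheap surrogate for a sup-norm along the imaginary axis:
# maximum modulus of the data-driven interpolant on a fine grid
grid = 1j * np.linspace(0.1, 10.0, 400)
vals = np.array([barycentric_eval(s, nodes, samples) for s in grid])
sup_norm_estimate = np.abs(vals).max()
```

Because the interpolant reuses previously computed samples, evaluating it on a dense grid costs only arithmetic, whereas computing the true norm of a large-scale system would require further large (if sparse) linear solves.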
DELMU: A Deep Learning Approach to Maximising the Utility of Virtualised Millimetre-Wave Backhauls
Advances in network programmability enable operators to 'slice' the physical
infrastructure into independent logical networks. Through this approach, each
network slice aims to accommodate the demands of increasingly diverse services.
However, precise allocation of resources to slices across future 5G
millimetre-wave backhaul networks, to optimise the total network utility, is
challenging. This is because the performance of different services often
depends on conflicting requirements, including bandwidth, sensitivity to delay,
and the monetary value of the traffic carried. In this paper, we put forward a
general rate utility framework for slicing mm-wave backhaul links, encompassing
all known types of service utilities, i.e. logarithmic, sigmoid, polynomial,
and linear. We then introduce DELMU, a deep learning solution that tackles the
complexity of optimising non-convex objective functions built upon arbitrary
combinations of such utilities. Specifically, by employing a stack of
convolutional blocks, DELMU can learn correlations between traffic demands and
achievable optimal rate assignments. We further regulate the inferences made by
the neural network through a simple 'sanity check' routine, which guarantees
both flow rate admissibility within the network's capacity region and minimum
service levels. The proposed method can be trained within minutes, following
which it computes rate allocations that match those obtained with
state-of-the-art global optimisation algorithms, yet orders of magnitude
faster. This confirms the applicability of DELMU to highly dynamic traffic
regimes and we demonstrate up to 62% network utility gains over a baseline
greedy approach.
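The 'sanity check' routine described above can be sketched as a simple projection of the network's predicted rates: raise each flow to its minimum service level, then, if the total exceeds the link capacity, shrink flows in proportion to their slack above the minimum. This is a hypothetical illustration of such a routine under assumed inputs (per-flow minimum rates and a single capacity value), not the paper's exact algorithm.

```python
import numpy as np

def sanity_check(predicted, r_min, capacity):
    """Clamp predicted flow rates so that each flow meets its minimum
    service level and the total stays within the link capacity.
    (Hypothetical routine for illustration.)"""
    rates = np.maximum(np.asarray(predicted, dtype=float), r_min)
    excess = rates.sum() - capacity
    if excess > 0:
        slack = rates - r_min          # reducible part of each flow
        total_slack = slack.sum()
        if total_slack > 0:
            # shrink flows proportionally to their slack above the minimum;
            # if even the minima exceed capacity, flows stay at their minima
            rates -= slack * min(1.0, excess / total_slack)
    return rates

# example: predictions sum to 12 on a 10-unit link, minima of 1 unit each
adjusted = sanity_check([5.0, 3.0, 4.0], np.full(3, 1.0), capacity=10.0)
```

In the example, the excess of 2 units is removed in proportion to each flow's slack, so the adjusted rates sum exactly to the capacity while every flow keeps at least its minimum rate.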